Products of Gaussians

Authors

  • Christopher K. I. Williams
  • Felix V. Agakov
  • Stephen N. Felderhof
Abstract

Recently Hinton (1999) has introduced the Products of Experts (PoE) model, in which several individual probabilistic models for data are combined to provide an overall model of the data. Below we consider PoE models in which each expert is a Gaussian. Although the product of Gaussians is also a Gaussian, if each Gaussian has a simple structure the product can have a richer structure. We examine (1) products of Gaussian pancakes, which give rise to probabilistic Minor Components Analysis, (2) products of 1-factor PPCA models, and (3) a products of experts construction for an AR(1) process.

Introduction

Recently Hinton (1999) has introduced the Products of Experts (PoE) model in which several individual probabilistic models for data are combined to provide an overall model of the data. In this paper we consider PoE models in which each expert is a Gaussian. It is easy to see that in this case the product model will also be Gaussian. However, if each Gaussian has a simple structure, the product can have a richer structure. Using Gaussian experts is attractive as it permits a thorough analysis of the product architecture, which can be difficult with other models, e.g. models defined over discrete random variables. Below we examine three cases of the products of Gaussians construction: (1) products of Gaussian pancakes (PoGP), which give rise to probabilistic Minor Components Analysis (MCA), providing a complementary result to the probabilistic Principal Components Analysis (PPCA) of Tipping and Bishop (1999); (2) products of 1-factor PPCA models; and (3) a products of experts construction for an AR(1) process.

Products of Gaussians

If each expert is a Gaussian, $p_i(\mathbf{x} \mid \theta_i) \sim N(\boldsymbol{\mu}_i, C_i)$, the resulting distribution of the product of $m$ Gaussians may be expressed as

$$p(\mathbf{x} \mid \theta) = \frac{\prod_{i=1}^{m} p_i(\mathbf{x} \mid \theta_i)}{\int \prod_{i=1}^{m} p_i(\mathbf{x} \mid \theta_i)\, d\mathbf{x}}.$$

By completing the square in the exponent it may easily be shown that $p(\mathbf{x} \mid \theta) \sim N(\boldsymbol{\mu}_\Sigma, C_\Sigma)$, where $C_\Sigma^{-1} = \sum_{i=1}^{m} C_i^{-1}$. To simplify the following derivations we will assume that $p_i(\mathbf{x} \mid \theta_i) \sim N(\mathbf{0}, C_i)$ and thus that $p(\mathbf{x} \mid \theta) \sim N(\mathbf{0}, C_\Sigma)$; the general case $\boldsymbol{\mu}_\Sigma \neq \mathbf{0}$ can be obtained by a translation of the coordinate system.

1 Products of Gaussian Pancakes

A Gaussian "pancake" (GP) is a $d$-dimensional Gaussian, contracted in one dimension and elongated in the other $d-1$ dimensions. In this section we show that the maximum likelihood solution for a product of Gaussian pancakes (PoGP) yields a probabilistic formulation of Minor Components Analysis (MCA).

1.1 Covariance Structure of a GP Expert

Consider a $d$-dimensional Gaussian whose probability contours are contracted in the direction $\mathbf{w}$ and equally elongated in the mutually orthogonal directions $\mathbf{v}_1, \ldots, \mathbf{v}_{d-1}$. We call this a Gaussian pancake or GP. Its inverse covariance may be written as

$$C^{-1} = \sum_{i=1}^{d-1} \mathbf{v}_i \mathbf{v}_i^T \beta_0 + \mathbf{w}\mathbf{w}^T \beta_w. \qquad (1)$$
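The two facts used above, that precisions add under the product and that the contracted pancake directions become the low-variance (minor) directions of the product, can be checked numerically. The following NumPy sketch is illustrative only: the dimension d, the number of experts m, the precisions beta_0 and beta_w, and the random directions w are assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 3                      # dimension and number of experts (illustrative)
beta_0, beta_w = 1.0, 25.0       # assumed precisions; beta_w > beta_0 contracts along w

ws, inv_covs = [], []
for _ in range(m):
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)       # unit contraction direction
    ws.append(w)
    # Eq. (1) with {v_i, w} an orthonormal basis, so sum_i v_i v_i^T = I - w w^T:
    inv_covs.append(beta_0 * (np.eye(d) - np.outer(w, w)) + beta_w * np.outer(w, w))

# Product of Gaussians: precisions add, C_Sigma^{-1} = sum_i C_i^{-1}.
inv_cov_prod = sum(inv_covs)

# Check: the summed expert log-densities and the product Gaussian's log-density
# have the same quadratic form in x (they differ only in normalization).
x = rng.standard_normal(d)
assert np.isclose(sum(-0.5 * x @ ic @ x for ic in inv_covs),
                  -0.5 * x @ inv_cov_prod @ x)

# MCA flavour: the smallest variances of the product lie in span{w_1, ..., w_m},
# because beta_w > beta_0 contributes extra precision there; the remaining
# d - m variances are exactly 1 / (m * beta_0).
variances = np.linalg.eigvalsh(np.linalg.inv(inv_cov_prod))
print("product variances (ascending):", np.round(variances, 4))
```

Running this, the m smallest variances separate clearly from the remaining d − m, which all equal 1/(m·beta_0): the pancake directions are recovered as the minor components of the product model.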

Similar Resources

An efficient approximate method for solution of the heat equation using Laguerre-Gaussians radial functions

In the present paper, a numerical method is considered for solving the one-dimensional heat equation subject to both Neumann and Dirichlet initial-boundary conditions. This method is a combination of the collocation method and radial basis functions (RBFs). The operational matrix of derivative for Laguerre-Gaussians (LG) radial basis functions is used to reduce the problem to a set of algebraic equations ...
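As an illustration of the operational-matrix idea mentioned in this excerpt, here is a minimal sketch in which plain Gaussian RBFs stand in for the paper's Laguerre-Gaussian basis (the basis choice, grid, and shape parameter eps are assumptions): differentiating the RBF interpolant at the collocation points amounts to applying one fixed matrix to the function samples.

```python
import numpy as np

# Operational matrix of derivative for an RBF basis (sketch; Gaussian RBFs
# used as a stand-in for the Laguerre-Gaussian basis of the excerpt).
n = 25
xs = np.linspace(0.0, 1.0, n)       # collocation points, also used as centers
eps = 8.0                           # RBF shape parameter (illustrative)

r = xs[:, None] - xs[None, :]       # pairwise differences x_j - c_k
B = np.exp(-(eps * r) ** 2)         # interpolation matrix  B[j, k] = phi(x_j - c_k)
Bx = -2.0 * eps**2 * r * B          # derivative matrix    Bx[j, k] = phi'(x_j - c_k)

# Operational matrix D: for samples f on xs, D @ f approximates f'.
D = Bx @ np.linalg.inv(B)

f = np.sin(2.0 * np.pi * xs)
fx_exact = 2.0 * np.pi * np.cos(2.0 * np.pi * xs)
print("max derivative error:", np.abs(D @ f - fx_exact).max())
```

For the heat equation one would build the analogous second-derivative matrix, replace the boundary rows to enforce the Dirichlet/Neumann conditions, and march in time, reducing the PDE to the set of algebraic equations the excerpt refers to.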


Bi-radial Transfer Functions

The most common transfer functions in neural networks are of the sigmoidal type. In this article other transfer functions are considered. Advantages of simple Gaussians, giving hyperelliptical densities, and Gaussian bar functions (sums of one-dimensional Gaussians) are discussed. Bi-radial functions are formed from products of two sigmoids. A product of M bi-radial functions in N-dimensional pa...
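To make "product of two sigmoids" concrete, here is one plausible parameterization as a sketch; the exact functional form and the names center, width, and slope are assumptions, not the paper's definition. Two opposing sigmoids multiply into a soft window per dimension, and a product over dimensions gives a localized bump.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def biradial(x, center, width, slope):
    """Soft window from a product of two opposing sigmoids per dimension;
    the product over dimensions localizes the response around `center`.
    (Sketch: this exact form is an assumption, not the paper's definition.)"""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    rising = sigmoid(slope * (x - (center - width)))   # turns on at the left edge
    falling = sigmoid(slope * (x - (center + width)))  # turns on at the right edge
    return float(np.prod(rising * (1.0 - falling)))

print(biradial(0.0, center=0.0, width=1.0, slope=4.0))  # near the peak: ~0.96
print(biradial(3.0, center=0.0, width=1.0, slope=4.0))  # far outside:  ~0.0003
```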


Agnostic Distribution Learning via Compression

We prove that Θ̃(kd^2/ε^2) samples are necessary and sufficient for learning a mixture of k Gaussians in R^d, up to error ε in total variation distance. This improves both the known upper bound and lower bound for this problem. For mixtures of axis-aligned Gaussians, we show that Õ(kd/ε^2) samples suffice, matching a known lower bound. Moreover, these results hold in an agnostic learning setting as ...


Products of Gaussians and Probabilistic Minor Component Analysis

Recently, Hinton introduced the products of experts architecture for density estimation, where individual expert probabilities are multiplied and renormalized. We consider products of Gaussian "pancakes" equally elongated in all directions except one and prove that the maximum likelihood solution for the model gives rise to a minor component analysis solution. We also discuss the covariance structure ...


Symmetry derivatives of Gaussians illustrated by cross tracking

We propose a family of complex differential operators, symmetry derivatives, for pattern recognition in images. We present three theorems on their properties as applied to Gaussians. These show that all orders of symmetry derivatives of Gaussians yield compact expressions obtained by replacing the original differential polynomial with an ordinary polynomial. Just like Gaussians, the symmetry de...
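The "differential polynomial becomes an ordinary polynomial" property can be checked symbolically for the first-order symmetry derivative D = ∂/∂x + i ∂/∂y (a sketch under the assumption that this is the operator intended): applied n times to a 2-D Gaussian, it yields (−(x+iy)/s²)^n times the same Gaussian.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
s = sp.symbols('s', positive=True)
G = sp.exp(-(x**2 + y**2) / (2 * s**2))              # unnormalized 2-D Gaussian

D = lambda f: sp.diff(f, x) + sp.I * sp.diff(f, y)   # first symmetry derivative

f = G
for n in range(1, 4):
    f = D(f)
    # The differential polynomial D^n collapses to an ordinary polynomial:
    # D^n G = (-(x + i*y) / s**2)**n * G
    claimed = (-(x + sp.I * y) / s**2) ** n * G
    assert sp.simplify(sp.expand(f - claimed)) == 0
print("D^n G = (-(x+iy)/s^2)^n * G holds for n = 1, 2, 3")
```

The cancellation works because D annihilates (x + iy), so each application of D only multiplies the Gaussian by another factor of −(x+iy)/s².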




Publication date: 2001